Fundamental Inequalities

Inequalities closely related to entropy and mutual information.

1 Fundamental Inequality

For $x > 0$: $$ \begin{aligned} \ln x &\in \left[ 1 - \frac{1}{x},\; x - 1 \right] \\ \log x &\in \left[ \left(1 - \frac{1}{x}\right) \log e,\; (x - 1)\log e \right] &[\log_b x = \log_b e \cdot \ln x] \end{aligned} $$ Equality throughout holds if and only if $x = 1$.
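As a quick numerical sanity check (the point $x = 2$ is an illustrative choice of mine, not from the source), both bounds hold with slack away from $x = 1$:

$$ 1 - \frac{1}{2} = 0.5 \;\leq\; \ln 2 \approx 0.693 \;\leq\; 2 - 1 = 1 $$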

2 Range of Entropy

$$ \begin{aligned} 0 \leq H(X) \leq \log |\mathcal{X}| \end{aligned} $$
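Both endpoints are attained (a standard check, sketched here): a deterministic $X$ gives $H(X) = 0$, and a uniform $X$ attains the upper bound:

$$ H(X) = -\sum_{x \in \mathcal{X}} \frac{1}{|\mathcal{X}|} \log \frac{1}{|\mathcal{X}|} = \log |\mathcal{X}| \quad \text{when } p(x) = \frac{1}{|\mathcal{X}|} $$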

3 Conditioning Reduces Entropy

$$ \begin{aligned} D(p||q) &\geq 0 &[-D(p||q) \leq 0, \text{ by the Fundamental Inequality}] \\ I(X; Y) &\geq 0 &[I(X; Y) = D(p_{X, Y} || p_{X} p_{Y})] \\ H(X) &\geq H(X|Y) &[H(X) - H(X | Y) = I(X; Y) \geq 0]\\ H(Y) &\geq H(Y|X) &[H(Y) - H(Y | X) = I(X; Y) \geq 0] \\ \end{aligned} $$

Be cautious: conditioning reduces entropy only on average over $Y$; for a particular outcome $y$, the conditional entropy can increase: $$ \begin{aligned} H(X | Y = y) &> H(X) &\text{[Possible $\checkmark$]} \end{aligned} $$
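A minimal numeric sketch of this caution (the joint distribution and the small `entropy` helper below are my own illustrative choices, echoing the classic Cover & Thomas example), assuming NumPy is available:

```python
import numpy as np

# Joint distribution p(x, y): rows index X in {1, 2}, columns index Y in {1, 2}.
# (Illustrative values only, not from the source.)
p_xy = np.array([[0.0, 1/8],
                 [3/4, 1/8]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

p_x = p_xy.sum(axis=1)   # marginal of X: [1/8, 7/8]
p_y = p_xy.sum(axis=0)   # marginal of Y: [3/4, 1/4]

H_X = entropy(p_x)
# H(X | Y = y) for each y, and the average H(X | Y) = sum_y p(y) H(X | Y = y)
H_X_given_y = [entropy(p_xy[:, j] / p_y[j]) for j in range(p_xy.shape[1])]
H_X_given_Y = sum(py * h for py, h in zip(p_y, H_X_given_y))

print(f"H(X)     = {H_X:.3f} bits")                          # ~0.544
print(f"H(X|Y=y) = {[round(h, 3) for h in H_X_given_y]}")    # [0.0, 1.0]
print(f"H(X|Y)   = {H_X_given_Y:.3f} bits")                  # 0.250
```

Here $H(X \mid Y = 2) = 1 > H(X) \approx 0.544$ bits, yet the average $H(X \mid Y) = 0.25 \leq H(X)$, exactly as the inequality above requires.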

4 Bound of Joint Entropy

By the chain rule and the fact that conditioning reduces entropy: $$ H(X_1, \dots, X_n) \leq \sum_{i=1}^{n} H(X_i) $$
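Spelled out (a standard one-line derivation): expand by the chain rule, then drop the conditioning in every term:

$$ H(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \dots, X_{i-1}) \leq \sum_{i=1}^{n} H(X_i) $$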

5 Jensen's Inequality

If $ f $ is convex over an interval $ \mathcal{I} $ and the alphabet $ \mathcal{X} \subseteq \mathcal{I} $, then: $$ \begin{aligned} E[f(X)] \geq f(E[X]) \end{aligned} $$
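One standard application (a sketch, not spelled out in the source): taking the convex function $f(t) = -\log t$ recovers the non-negativity of relative entropy from Section 3:

$$ D(p||q) = E_p\!\left[ -\log \frac{q(X)}{p(X)} \right] \geq -\log E_p\!\left[ \frac{q(X)}{p(X)} \right] = -\log \sum_{x:\, p(x) > 0} q(x) \geq -\log 1 = 0 $$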

by Jon